PU-Transformer: Point Cloud Upsampling Transformer

Authors

Abstract

Given the rapid development of 3D scanners, point clouds are becoming popular in AI-driven machines. However, point cloud data is inherently sparse and irregular, causing significant difficulties for machine perception. In this work, we focus on the point cloud upsampling task that intends to generate dense high-fidelity point clouds from sparse input data. Specifically, to activate the transformer's strong capability in representing point-wise features, we develop a new variant of a multi-head self-attention structure to enhance both point-wise and channel-wise relations of the feature map. In addition, we leverage a positional fusion block to comprehensively capture the local context of the point cloud data, providing more position-related information about the scattered points. As the first transformer model introduced for point cloud upsampling, we demonstrate the outstanding performance of our approach by comparing with state-of-the-art CNN-based methods on different benchmarks, quantitatively and qualitatively.
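The abstract names two building blocks: a multi-head self-attention variant that models both point-wise and channel-wise relations, and a positional fusion block. Below is a minimal PyTorch sketch of the first idea only, assuming a (batch, points, channels) feature map; the class name PointChannelAttention and all layer choices are illustrative assumptions, not the authors' released architecture.

    # Illustrative sketch only, not the paper's code: one attention pass
    # over the point axis (N x N relations) and one over the channel axis
    # (C x C relations) of a point feature map.
    import torch
    import torch.nn as nn

    class PointChannelAttention(nn.Module):
        def __init__(self, channels: int, num_heads: int = 4):
            super().__init__()
            # Standard multi-head self-attention over the N points.
            self.point_attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
            self.norm1 = nn.LayerNorm(channels)
            self.norm2 = nn.LayerNorm(channels)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (B, N, C) point-wise feature map.
            h = self.norm1(x)
            point_out, _ = self.point_attn(h, h, h)   # point-wise (N x N) relations
            x = x + point_out
            # Channel-wise relations: attend over the transposed map, so the
            # similarity matrix is (C x C) and channels re-weight each other.
            h = self.norm2(x).transpose(1, 2)                                     # (B, C, N)
            sim = torch.softmax(h @ h.transpose(1, 2) / h.shape[-1] ** 0.5, dim=-1)
            x = x + (sim @ h).transpose(1, 2)                                     # back to (B, N, C)
            return x

    feats = torch.randn(2, 256, 64)                  # 2 clouds, 256 points, 64 channels
    print(PointChannelAttention(64)(feats).shape)    # torch.Size([2, 256, 64])

The point-wise pass compares the N points with each other, while the transposed pass compares the C channels, so the two pathways re-weight different axes of the same feature map.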

Similar Resources

PU-Net: Point Cloud Upsampling Network

Learning and analyzing 3D point clouds with deep networks is challenging due to the sparseness and irregularity of the data. In this paper, we present a data-driven point cloud upsampling technique. The key idea is to learn multilevel features per point and expand the point set via a multibranch convolution unit implicitly in feature space. The expanded feature is then split to a multitude of f...
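As a rough illustration of the expansion step described above, the sketch below expands a (batch, channels, points) feature map r-fold in feature space using r independent 1x1-convolution branches whose outputs are concatenated along the point axis; the name FeatureExpansion and the two-layer branch design are assumptions, not PU-Net's published code.

    # Hedged sketch of multi-branch feature expansion: each branch produces
    # one upsampled copy of every point's feature, so N points become r*N.
    import torch
    import torch.nn as nn

    class FeatureExpansion(nn.Module):
        def __init__(self, in_channels: int, out_channels: int, ratio: int = 4):
            super().__init__()
            # One independent 1x1-conv branch per upsampled copy of each point.
            self.branches = nn.ModuleList(
                nn.Sequential(
                    nn.Conv1d(in_channels, out_channels, 1),
                    nn.ReLU(),
                    nn.Conv1d(out_channels, out_channels, 1),
                )
                for _ in range(ratio)
            )

        def forward(self, feats: torch.Tensor) -> torch.Tensor:
            # feats: (B, C, N) -> (B, C', r*N) by concatenating branch outputs.
            return torch.cat([branch(feats) for branch in self.branches], dim=2)

    x = torch.randn(1, 128, 256)               # 256 input points
    print(FeatureExpansion(128, 64)(x).shape)  # torch.Size([1, 64, 1024])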

Transformer proteins

Proteins are generally believed to adopt a unique fold, defined by their amino acid sequence, under specific environmental conditions. These unique structures, in turn, endow proteins with one specific function. However, not all proteins obey the “1 amino acid sequence → 1 fold → 1 function” scheme. Moonlighting proteins that adopt one distinct three-dimensional structure but can accomplish two ...

Image Transformer

Image generation has been successfully cast as an autoregressive sequence generation or transformation problem. Recent work has shown that self-attention is an effective way of modeling textual sequences. In this work, we generalize a recently proposed model architecture based on self-attention, the Transformer, to a sequence modeling formulation of image generation with a tractable likelihood....
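A minimal sketch of the mechanism this snippet describes, under the assumption that the image is flattened to a pixel sequence: a causal (upper-triangular) mask restricts self-attention so each position sees only earlier pixels, which is what keeps the autoregressive likelihood tractable. Stock PyTorch is used here, not the paper's implementation.

    import torch
    import torch.nn as nn

    seq_len, d_model = 64, 32              # e.g. an 8x8 image, flattened
    attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
    x = torch.randn(1, seq_len, d_model)   # embedded pixel sequence

    # Boolean mask: True entries are disallowed (future pixels).
    causal_mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)
    out, _ = attn(x, x, x, attn_mask=causal_mask)
    print(out.shape)                       # torch.Size([1, 64, 32])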

Transformer in Grid

The aim of this research article is to determine how to install surge arresters close to a power transformer to protect it against lightning overvoltage. Depending on the length of the cables used in the installation, the insulation levels in the base insulators of surge arresters and the bushings of transformers change according to the voltage they support. For validation purposes, the vol...

Journal

Journal title: Lecture Notes in Computer Science

Year: 2023

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-031-26319-4_20